minimi[e82,jmc]		Minimizing logical expressions

	The general form of logical minimization may be the following:

We have an expression  A  whose truth is to be maintained, namely the axiom
in the case of circumscription.  A finite number of such expressions can be
combined into one by conjunction, but there could be a significant
generalization when there are infinitely many axioms.

We have a wff  E(x1,...,xm,y1,...,yn,z1,...,zp), summarized as  E(x,y,z).

We wish to minimize  λx.E(x,y,z)  with respect to the partial ordering

	λx.E(x,y,z) ≤ λx.E(x,y',z') ≡ ∀x.E(x,y,z) ⊃ E(x,y',z').
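
As an illustrative sketch (the particular  E,  the finite range of  x,  and
the parameter values below are invented for the example), the ordering can be
tested directly when  x  ranges over a finite set:

X = range(0, 10)                     # finite stand-in for the range of  x

def E(x, y, z):                      # hypothetical wff  E(x,y,z):  y ≤ x < z
    return y <= x < z

def leq(yz, yz_prime):
    """λx.E(x,y,z) ≤ λx.E(x,y',z'):  every  x  satisfying the first
    also satisfies the second, i.e.  ∀x. E(x,y,z) ⊃ E(x,y',z')."""
    (y, z), (yp, zp) = yz, yz_prime
    return all((not E(x, y, z)) or E(x, yp, zp) for x in X)

print(leq((3, 5), (2, 7)))   # True:  the interval [3,5) is contained in [2,7)
print(leq((2, 7), (3, 5)))   # False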

The variables  x  are called Pareto variables after the economist
who defined what are called Pareto optimal states of an economy,
wherein the situation cannot be improved for some people without
making it worse for others.  Pareto optimization can also apply
among the different aspects of a situation from the point of view
of a single person.

	The variables  y  are called
parameters and are to be held fixed.  The variables  z  are to be
chosen to minimize  λx.E(x,y,z)  holding the  y  variables fixed and
keeping the expression  A  true.   In the circumscription examples
(McCarthy 1980),
the  x  variables have been individual variables, the  y  and  z
variables have been predicate variables, and  E  has had the
special form  P(x,y),  where  P  is one of the  z  variables.

	Before considering the problem in general, we consider some
special cases.

	1. The simplest case is when there are no  x  variables.
Our goal is then to make the expression  E(y,z)  false if possible,
while holding  A  true.  The problem therefore reduces to the satisfiability
of  A ∧ ¬E(y,z).
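
This reduction can be sketched as follows (the brute-force test  satisfiable
and the particular  A  and  E  are invented for illustration; any
satisfiability procedure would do):

from itertools import product

def satisfiable(formula, variables):
    """Brute-force satisfiability: formula maps a truth assignment (a dict
    from variable names to booleans) to a truth value.  Returns a witness
    assignment, or None if the formula is unsatisfiable."""
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if formula(assignment):
            return assignment
    return None

# Case 1: no  x  variables, so minimizing  E(y,z)  while keeping  A  true
# reduces to the satisfiability of  A ∧ ¬E(y,z).
A = lambda v: v["y"] or v["z"]       # hypothetical axiom  A(y,z)
E = lambda v: v["y"] and v["z"]      # hypothetical expression  E(y,z)
witness = satisfiable(lambda v: A(v) and not E(v), ["y", "z"])
print(witness)   # {'y': False, 'z': True}:  E  can be made false while  A  holds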

Do we really need all three of  x,  y, and  z?

1. The aspect of the Lagrange multiplier method for which there may be
an analogy is this: when the expressions held constant, multiplied
by the Lagrange multipliers, are added to the expression to be
minimized, we get an expression to be minimized absolutely.  Perhaps
there is some way of turning logical relative minimization problems
into absolute minimization problems.

2. Bossu and Siegel's "inferior implication" is monotonic on positive
formulas.  Is this partially true of circumscription?

3. Bossu and Siegel show that every set of clauses is minimally
modelable.  How about extending this to a more general notion
of minimization?

4. A discriminant interpretation is one in which all ground terms
correspond to distinct individuals.   Herbrand interpretations are
discriminant.

The propositional case

1. If we have a single propositional expression  s  to minimize, it
is simply a question of whether  ¬s  is satisfiable.

2. Suppose now we have the expression  e(p) = s1(p)∧q ∨ s2(p)∧¬q,
and we consider  e(p) ≤ e(p')  to hold only if  e(p) ⊃ e(p')  is true
for both  q  true and  q  false.  We win big if  ¬s1(p)∧¬s2(p)  is
satisfiable, because then we can make  e(p)  false independently
of  q.  Otherwise, we ask whether either  ¬s1(p)  or  ¬s2(p)  is
satisfiable.  Solutions of either give minimizations.  If neither
is satisfiable, then  e(p)  is a tautology and all values of  p
are equally good.

3. With two  q  variables, we have

e(p,q) = s11(p)∧q1∧q2 ∨ s12(p)∧q1∧¬q2 ∨ s21(p)∧¬q1∧q2 ∨ s22(p)∧¬q1∧¬q2.

We must first ask if

¬s11(p)∧¬s12(p)∧¬s21(p)∧¬s22(p)

is satisfiable; if so, its solutions give the minimizations.  Otherwise,
the next step is to look for triples of the negations that are simultaneously
satisfiable, and so on.
In general, the minimizations are given by tuples of the  s's  whose
negations are simultaneously satisfiable and which are not included in
larger tuples whose negations are simultaneously satisfiable.
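
This procedure can be sketched as follows (the brute-force satisfiability
test and the particular  s's  and  p  variables are invented for
illustration):

from itertools import combinations, product

def satisfiable(formula, variables):
    # brute-force test, as in the earlier sketch
    return any(formula(dict(zip(variables, values)))
               for values in product([False, True], repeat=len(variables)))

def maximal_falsifiable_tuples(s_list, p_vars):
    """Return the tuples of indices  i  such that the conjunction of the
    negations  ¬s_i(p)  is simultaneously satisfiable, and which are not
    included in larger tuples with the same property."""
    k = len(s_list)
    sat = [T for size in range(k, 0, -1)
             for T in combinations(range(k), size)
             if satisfiable(lambda v: all(not s_list[i](v) for i in T), p_vars)]
    # keep only tuples not strictly contained in a larger satisfiable tuple
    return [T for T in sat if not any(set(T) < set(U) for U in sat)]

# Hypothetical example with two  q  variables, hence four  s's,  and  p = (p1,p2):
s11 = lambda v: v["p1"]
s12 = lambda v: v["p1"] or v["p2"]
s21 = lambda v: v["p2"]
s22 = lambda v: v["p1"] or not v["p2"]
print(maximal_falsifiable_tuples([s11, s12, s21, s22], ["p1", "p2"]))
# [(0, 1, 2), (0, 3)]:  p  can falsify  s11, s12, s21  together, or  s11, s22.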

Thus we see that the problem of minimization in the propositional case
is an extension of the problem of satisfaction, and gives rise to
patterns of satisfaction problems.

The problem just treated may be called absolute minimization, because
there is no axiom to be kept true.  Suppose there is such an axiom.  It
can be written

a11(p)∧q1∧q2 ∨ a12(p)∧q1∧¬q2 ∨ etc.

Since it must be true for all values of  q1  and  q2, we must choose
p  subject to the restriction

a11(p)∧a12(p)∧a21(p)∧a22(p).

Thus the axiom really depends only on  p  and can be written  a(p).
Then the satisfaction problems mentioned  above merely have  a(p)
conjoined to them.  For example, the first possibility is that

a(p)∧¬s11(p)∧¬s12(p)∧¬s21(p)∧¬s22(p)

is satisfiable.
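
In the same sketch (again with invented  a  and  s's), each satisfiability
test simply gets  a(p)  conjoined to it, e.g. the first possibility:

from itertools import product

def satisfiable(formula, variables):
    # brute-force test, as in the earlier sketches
    return any(formula(dict(zip(variables, values)))
               for values in product([False, True], repeat=len(variables)))

a   = lambda v: v["p1"] or v["p2"]       # hypothetical axiom  a(p)
s11 = lambda v: v["p1"]                  # hypothetical  s's  as before
s12 = lambda v: v["p1"] or v["p2"]
s21 = lambda v: v["p2"]
s22 = lambda v: v["p1"] or not v["p2"]

# First possibility:  a(p) ∧ ¬s11(p) ∧ ¬s12(p) ∧ ¬s21(p) ∧ ¬s22(p)  satisfiable?
print(satisfiable(lambda v: a(v) and not s11(v) and not s12(v)
                            and not s21(v) and not s22(v), ["p1", "p2"]))
# False here, so one would go on to the triples, now with  a(p)  conjoined.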


Minimization of functions of individual variables

	The minimization problems raised by circumscription are second
order problems, because the entities varied in order to perform the
minimizations are predicates.  The corresponding first order problem
is the following:

We have a predicate  P(x,y)  and we order values of  x  by the
relation

x ≤ x' ≡ ∀y.P(x,y) ⊃ P(x',y).

Then the condition for a minimum  x0  subject to the constraint  a(x)
becomes

a(x0)  ∧ ∀x.a(x) ∧ ∀y.(P(x,y) ⊃ P(x0,y)) ⊃ ∀y.(P(x0,y) ⊃ P(x,y)).
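
As an illustration (the domain,  P,  and  a  below are invented), the
condition can be checked directly when  x  and  y  range over small finite
sets:

X = [0, 1, 2, 3]                 # finite stand-in for the range of  x
Y = [0, 1, 2, 3]                 # finite stand-in for the range of  y

def P(x, y):                     # hypothetical predicate:  P(x,y)  iff  y < x
    return y < x

def a(x):                        # hypothetical constraint:  x  must be positive
    return x > 0

def leq(x, xp):                  # x ≤ x'  ≡  ∀y. P(x,y) ⊃ P(x',y)
    return all((not P(x, y)) or P(xp, y) for y in Y)

def minimal(x0):                 # the displayed condition, relativized to  X  and  Y
    return a(x0) and all((not (a(x) and leq(x, x0))) or leq(x0, x) for x in X)

print([x0 for x0 in X if minimal(x0)])    # [1]:  the least element satisfying  a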


Remarks:

1. While the propositional case and the first order case are logically
simpler than the circumscription case, we don't have any AI or
philosophical examples that call for these kinds of minimization.
More thought may reveal some, and independently of that, there
may be some interesting mathematical properties.

2. Carolyn Talcott suggests the following in the context of circumscription,
but it may be an interesting question in the more general case also.
Suppose we have a consistent collection of sentences  A, and  A1  and
A2  are subcollections such that the (second order)
minimization formula  p1  of some expression  E
with  A1  as axioms is consistent with  A  and the minimization
formula  p2  of  E  with  A2  as axioms is consistent with  A.
Then when is  p1∧p2∧A  consistent?  Might it always be consistent?